---
title: Unlock Holdout
description: Holdout, the portion of your data that DataRobot reserves when building models, provides an evaluation metric that measures a model's accuracy against unseen data and validates model quality.

---


# Unlock Holdout {: #unlock-holdout }

The *Holdout* column displays an evaluation metric that measures a model's accuracy against unseen ("new") data. Holdout is calculated using the trained model's predictions on the [holdout partition](data-partitioning). DataRobot reserves a portion of your data as holdout (20% by default); it does not train models on this data but instead uses it to validate model quality after training.

!!! tip
    Only unlock your holdout data after you have made all of your model-related decisions. **Once a project's holdout has been unlocked, it cannot be re-locked.**

!!! note
    If you run full or Quick Autopilot and DataRobot returns a model [recommended and prepared for deployment](model-rec-process), the process described below differs slightly.

To display a specific model's Holdout score:

1.  Click **Unlock project Holdout for all models** on the rightmost panel.

    ![](images/holdout-unlock.png)

2.  Confirm your decision by clicking **Unlock holdout**.
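The same action can be performed programmatically with DataRobot's Python client. A minimal sketch, assuming the `datarobot` package is installed and the client is already configured with your endpoint and API token (the project ID shown is a placeholder):

```python
def unlock_project_holdout(project_id):
    """Unlock holdout for every model in a project.

    This is irreversible: once unlocked, holdout cannot be re-locked.
    """
    # Imported lazily so the sketch can be read without the client installed.
    import datarobot as dr

    project = dr.Project.get(project_id)
    project.unlock_holdout()  # same effect as the UI's "Unlock holdout" button
    return project


# Usage (hypothetical project ID):
# project = unlock_project_holdout("5f1a2b3c4d5e6f7a8b9c0d1e")
```

Keeping the call behind a named function makes the irreversible step explicit in calling code rather than a one-liner that is easy to run by accident.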


When you unlock holdout, the label on the project menu changes to **Holdout is unlocked** and a value appears in the Holdout column.

![](images/holdout-unlock-after.png)

Once you have unlocked the holdout data, you can view the Leaderboard scores computed on that data. Then, open the [Lift Chart](lift-chart) and alternate the **Data Source** dropdown between Validation and Holdout to compare the accuracy of the model's predictions on each partition.
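Holdout scores can also be read programmatically once unlocked. In the DataRobot Python client, `Model.metrics` is a mapping from metric name to per-partition scores (keys such as `validation`, `crossValidation`, and `holdout`). A minimal sketch of pulling the holdout score out of that structure; the sample dictionary below is illustrative, not real project output:

```python
def holdout_score(metrics, metric_name):
    """Return the holdout score for one metric, or None if holdout is
    still locked or the metric is absent.

    `metrics` follows the shape of `Model.metrics` in the DataRobot
    Python client: {metric_name: {"validation": ..., "holdout": ...}}.
    """
    return metrics.get(metric_name, {}).get("holdout")


# Illustrative metrics dictionary (hypothetical values):
sample_metrics = {
    "LogLoss": {"validation": 0.4421, "crossValidation": 0.4508, "holdout": 0.4515},
    "AUC": {"validation": 0.8612, "crossValidation": 0.8575, "holdout": 0.8540},
}

print(holdout_score(sample_metrics, "LogLoss"))  # 0.4515
print(holdout_score(sample_metrics, "RMSE"))     # None (metric not present)
```

Comparing the `validation` and `holdout` values for the same metric is the programmatic equivalent of toggling the **Data Source** dropdown: a large gap between the two suggests the model may not generalize well.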
